Differentially private data aggregating with relative error constraint
Authors
Abstract
Privacy-preserving methods supporting data aggregation have attracted the attention of researchers in multidisciplinary fields. Among advanced methods, differential privacy (DP) has become an influential mechanism owing to its rigorous guarantee and high utility. However, DP places no limitation on the bound of the added noise, leading to a low level of utility. Recently, researchers have investigated how to achieve DP while limiting the relative error to a fixed bound. However, these schemes destroy statistical properties, including the mean, variance, and MSE, which are foundational elements for data analysis. In this paper, we explore an optimal solution, with novel definitions and implementing mechanisms, to maintain these statistical properties while satisfying DP under a relative error constraint. Experimental evaluation demonstrates that our approach outperforms current schemes in terms of security and utility for large quantities of queries.
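To make the trade-off described in the abstract concrete, the following minimal Python sketch (not the paper's mechanism) contrasts the standard Laplace mechanism, whose noise is unbounded and so admits no fixed relative-error bound, with a naive resampling variant that forces the relative error below a bound `gamma`. Such truncation is exactly the kind of scheme the paper criticises, since it biases the mean, variance, and MSE of the released answers. All function names, parameter values, and the example query answer are illustrative assumptions.

```python
import numpy as np


def laplace_mechanism(true_value, sensitivity, epsilon, rng):
    """Standard Laplace mechanism: noise scale = sensitivity / epsilon.

    The noise is unbounded, so the relative error
    |noisy - true| / |true| has no fixed upper bound.
    """
    noise = rng.laplace(loc=0.0, scale=sensitivity / epsilon)
    return true_value + noise


def bounded_relative_error_release(true_value, sensitivity, epsilon, gamma, rng):
    """Hypothetical variant: resample Laplace noise until the relative error
    stays within the fixed bound `gamma` (assumes true_value != 0).

    Simple rejection like this biases the released statistics, which is the
    weakness of existing bounded-relative-error schemes noted in the abstract.
    """
    bound = gamma * abs(true_value)
    while True:
        noise = rng.laplace(loc=0.0, scale=sensitivity / epsilon)
        if abs(noise) <= bound:
            return true_value + noise


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    true_count = 120.0                  # example aggregate query answer
    eps, sens, gamma = 0.5, 1.0, 0.1    # privacy budget, sensitivity, relative-error bound

    plain = [laplace_mechanism(true_count, sens, eps, rng) for _ in range(10_000)]
    bounded = [bounded_relative_error_release(true_count, sens, eps, gamma, rng)
               for _ in range(10_000)]

    print("unbounded noise: max relative error =",
          max(abs(v - true_count) / true_count for v in plain))
    print("bounded noise:   max relative error =",
          max(abs(v - true_count) / true_count for v in bounded))
```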
Similar resources
Differentially Private Local Electricity Markets
Privacy-preserving electricity markets have a key role in steering customers towards participation in local electricity markets by guaranteeing to protect their sensitive information. Moreover, these markets make it possible to statistically release and share the market outputs for social good. This paper aims to design a market for local energy communities by implementing Differential Privacy (DP)...
Differentially Private Trajectory Data Publication
With the increasing prevalence of location-aware devices, trajectory data has been generated and collected in various application domains. Trajectory data carries rich information that is useful for many data analysis tasks. Yet, improper publishing and use of trajectory data could jeopardize individual privacy. However, it has been shown that existing privacy-preserving trajectory data publish...
Differentially private Bayesian learning on distributed data
Many applications of machine learning, for example in health care, would benefit from methods that can guarantee privacy of data subjects. Differential privacy (DP) has become established as a standard for protecting learning results, but the proposed algorithms require a single trusted party to have access to the entire data, which is a clear weakness. We consider DP Bayesian learning in a dis...
Differentially Private Publication of Sparse Data
The problem of privately releasing data is to provide a version of a dataset without revealing sensitive information about the individuals who contribute to the data. The model of differential privacy allows such private release while providing strong guarantees on the output. A basic mechanism achieves differential privacy by adding noise to the frequency counts in the contingency tables (or, ...
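As an illustration of the basic mechanism mentioned in the snippet above, the short Python sketch below adds Laplace noise with scale 1/ε to every cell of a sparse count vector; a histogram query has sensitivity 1, so this satisfies ε-differential privacy. The array contents and parameters are made-up examples, not data from the cited paper.

```python
import numpy as np


def private_histogram(counts, epsilon, rng):
    """Add Laplace noise with scale 1/epsilon to each frequency count.

    A histogram has sensitivity 1 (one individual changes one cell by 1),
    so this release is epsilon-differentially private. Note that noise must
    be added to the zero cells as well, which is costly for sparse data.
    """
    return counts + rng.laplace(loc=0.0, scale=1.0 / epsilon, size=counts.shape)


rng = np.random.default_rng(1)
counts = np.array([0, 0, 0, 5, 0, 0, 12, 0], dtype=float)  # sparse contingency-table counts
print(private_histogram(counts, epsilon=1.0, rng=rng))
```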
Differentially Private Data Release through Multidimensional Partitioning
Differential privacy is a strong notion for protecting individual privacy in privacy preserving data analysis or publishing. In this paper, we study the problem of differentially private histogram release based on an interactive differential privacy interface. We propose two multidimensional partitioning strategies including a baseline cell-based partitioning and an innovative kd-tree based par...
Journal
Journal title: Complex & Intelligent Systems
Year: 2021
ISSN: ['2198-6053', '2199-4536']
DOI: https://doi.org/10.1007/s40747-021-00550-3